
    Automorphisms of complex reflection groups

    Let G ⊂ GL(ℂ^r) be a finite complex reflection group. We show that when G is irreducible, apart from the exception G = 𝔖_6, as well as for a large class of non-irreducible groups, any automorphism of G is the product of a central automorphism and of an automorphism which preserves the reflections. We show further that an automorphism which preserves the reflections is the product of an element of N_{GL(ℂ^r)}(G) and of a "Galois" automorphism: we show that Gal(K/ℚ), where K is the field of definition of G, injects into the group of outer automorphisms of G, and that this injection can be chosen so that it induces the usual Galois action on characters of G, apart from a few exceptional characters; further, replacing K if needed by an extension of degree 2, the injection can be lifted to Aut(G), and every irreducible representation admits a model which is equivariant with respect to this lifting. Along the way we show that the fundamental invariants of G can be chosen to be rational.

    An empirical Bayes procedure for the selection of Gaussian graphical models

    A new methodology for model determination in decomposable graphical Gaussian models is developed. The Bayesian paradigm is used and, for each given graph, a hyper inverse Wishart prior distribution on the covariance matrix is considered. This prior distribution depends on hyper-parameters. It is well known that the model's posterior distribution is sensitive to the specification of these hyper-parameters, and no completely satisfactory method for choosing them has been established. In order to avoid this problem, we suggest adopting an empirical Bayes strategy, that is, a strategy for which the values of the hyper-parameters are determined using the data. Typically, the hyper-parameters are fixed at their maximum likelihood estimates. In order to compute these maximum likelihood estimates, we suggest a Markov chain Monte Carlo version of the Stochastic Approximation EM algorithm. Moreover, we introduce a new sampling scheme in the space of graphs that improves the add and delete proposal of Armstrong et al. (2009). We illustrate the efficiency of this new scheme on simulated and real datasets.
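    The add/delete move over graph space can be sketched as a Metropolis chain that toggles a single edge per step. This is only an illustrative skeleton: the target below is a toy sparsity prior, exp(-λ·|E(G)|), standing in for the hyper inverse Wishart marginal likelihood of the paper, and all function names are assumptions.

```python
import math
import random

def toggle_edge(edges, p, rng):
    """Symmetric add/delete proposal: pick one vertex pair uniformly
    at random and toggle the corresponding edge."""
    i, j = rng.sample(range(p), 2)
    e = frozenset((i, j))
    out = set(edges)
    if e in out:
        out.remove(e)
    else:
        out.add(e)
    return out

def mh_graph_chain(p, lam, n_iter, seed=0):
    """Metropolis chain over undirected graphs on p vertices, targeting
    the toy sparsity target pi(G) proportional to exp(-lam * |E(G)|)."""
    rng = random.Random(seed)
    edges, sizes = set(), []
    for _ in range(n_iter):
        prop = toggle_edge(edges, p, rng)
        # the toggle proposal is symmetric, so only the target ratio matters
        log_ratio = -lam * (len(prop) - len(edges))
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            edges = prop
        sizes.append(len(edges))
    return sizes
```

    In a real implementation, `log_ratio` would instead compare the (decomposable) graphs' marginal likelihoods under the hyper inverse Wishart prior.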

    On computational tools for Bayesian data analysis

    While Robert and Rousseau (2010) addressed the foundational aspects of Bayesian analysis, the current chapter details its practical aspects through a review of the computational methods available for approximating Bayesian procedures. Recent innovations like Markov chain Monte Carlo, sequential Monte Carlo methods and, more recently, Approximate Bayesian Computation techniques have considerably increased the potential for Bayesian applications, and they have also opened new avenues for Bayesian inference, first and foremost Bayesian model choice.
    Comment: This is a chapter for the book "Bayesian Methods and Expert Elicitation" edited by Klaus Bocker; 23 pages, 9 figures.
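    Of the techniques this review covers, Approximate Bayesian Computation is the easiest to sketch: draw parameters from the prior, simulate data, and keep draws whose simulated summary lands near the observed one. The toy problem below (inferring a normal mean from its sample mean) and every name in it are illustrative assumptions, not anything from the chapter.

```python
import random
import statistics

def abc_rejection(obs_stat, prior_sampler, simulate, eps, n_prop, seed=1):
    """Minimal ABC rejection sampler: keep prior draws whose simulated
    summary statistic lands within eps of the observed one."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_prop):
        theta = prior_sampler(rng)
        if abs(simulate(theta, rng) - obs_stat) < eps:
            kept.append(theta)
    return kept

# Toy problem: infer the mean theta of a N(theta, 1) sample of size 50
# from its sample mean, with a flat prior on (-5, 5).
obs_mean = 2.0
post = abc_rejection(
    obs_mean,
    prior_sampler=lambda rng: rng.uniform(-5.0, 5.0),
    simulate=lambda theta, rng: statistics.fmean(
        rng.gauss(theta, 1.0) for _ in range(50)
    ),
    eps=0.2,
    n_prop=5000,
)
```

    The accepted draws in `post` approximate the posterior; shrinking `eps` trades acceptance rate for accuracy.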

    Importance sampling methods for Bayesian discrimination between embedded models

    This paper surveys some well-established approaches to the approximation of Bayes factors used in Bayesian model choice, mostly as covered in Chen et al. (2000). Our focus here is on methods that are based on importance sampling strategies rather than variable dimension techniques like reversible jump MCMC, including: crude Monte Carlo, maximum likelihood based importance sampling, bridge and harmonic mean sampling, as well as Chib's method based on the exploitation of a functional equality. We demonstrate in this survey how these different methods can be efficiently implemented for testing the significance of a predictive variable in a probit model. Finally, we compare their performance on a real dataset.
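    The crude Monte Carlo estimator mentioned first is simple enough to sketch: the marginal likelihood is the prior expectation of the likelihood, so averaging the likelihood over prior draws gives an (often inefficient) estimate. The toy normal-mean comparison below is an assumption for illustration, not the probit example of the paper.

```python
import math
import random

def log_lik(y, mu):
    """Log-likelihood of i.i.d. N(mu, 1) data."""
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (v - mu) ** 2 for v in y)

def crude_mc_log_marginal(y, n_draws, seed=0):
    """Crude Monte Carlo estimate of the log marginal likelihood under
    M1: mu ~ N(0, 1), by averaging the likelihood over prior draws
    (log-sum-exp trick for numerical stability)."""
    rng = random.Random(seed)
    logs = [log_lik(y, rng.gauss(0.0, 1.0)) for _ in range(n_draws)]
    top = max(logs)
    return top + math.log(sum(math.exp(l - top) for l in logs) / n_draws)

# Bayes factor of M1 (free mean) against M0 (mu = 0) for a toy sample:
y = [0.3, -0.1, 0.5, 0.2, 0.0]
log_bf10 = crude_mc_log_marginal(y, 20000) - log_lik(y, 0.0)
```

    In this conjugate toy case the marginal is available in closed form, which is what makes the estimator easy to check; the surveyed methods matter precisely when no such closed form exists.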

    Bayesian Core: The Complete Solution Manual

    This solution manual contains the unabridged and original solutions to all the exercises proposed in Bayesian Core, along with R programs when necessary.
    Comment: 118+vii pages, 21 figures, 152 solutions.

    Bounding rare event probabilities in computer experiments

    We are interested in bounding probabilities of rare events in the context of computer experiments. These rare events depend on the output of a physical model with random input variables. Since the model is only known through an expensive black-box function, standard efficient Monte Carlo methods designed for rare events cannot be used. We therefore propose a strategy to deal with this difficulty based on importance sampling methods. This proposal relies on Kriging metamodeling and is able to achieve sharp upper confidence bounds on the rare event probabilities. The variability due to the Kriging metamodeling step is properly taken into account. The proposed methodology is applied to a toy example and compared to more standard Bayesian bounds. Finally, a challenging real case study is analyzed. It consists of finding an upper bound of the probability that the trajectory of an airborne load will collide with the aircraft that has released it.
    Comment: 21 pages, 6 figures.
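    The importance sampling idea underlying such bounds can be shown on an analytically tractable toy: shift the sampling distribution toward the rare region and reweight each hit by the density ratio. The Gaussian tail example below is an assumption for illustration; the Kriging metamodeling step, which is the paper's actual contribution, is beyond a short sketch.

```python
import math
import random

def is_tail_prob(t, mu_prop, n, seed=0):
    """Importance sampling estimate of P(X > t) for X ~ N(0, 1), using a
    shifted proposal N(mu_prop, 1).  The weight is the density ratio
    phi(x) / phi(x - mu_prop) = exp(-mu_prop * x + mu_prop**2 / 2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_prop, 1.0)
        if x > t:
            total += math.exp(-mu_prop * x + 0.5 * mu_prop ** 2)
    return total / n

# Centering the proposal at the threshold makes hits frequent while the
# weights stay small, so the tiny probability is estimated accurately.
est = is_tail_prob(4.0, 4.0, 20000)
```

    Crude Monte Carlo with the same budget would see essentially no exceedances of t = 4, which is why plain sampling is hopeless for rare events.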

    Maximin design on non hypercube domain and kernel interpolation

    In the paradigm of computer experiments, the choice of an experimental design is an important issue. When no information is available about the black-box function to be approximated, an exploratory design has to be used. In this context, two dispersion criteria are usually considered: the minimax and the maximin ones. In the case of a hypercube domain, a standard strategy consists of taking the maximin design within the class of Latin hypercube designs. However, in a non-hypercube context, it does not make sense to use the Latin hypercube strategy. Moreover, whatever the design is, the black-box function is typically approximated by kernel interpolation. Here, we first provide a theoretical justification of the maximin criterion with respect to kernel interpolation. Then, we propose simulated annealing algorithms to determine maximin designs in any bounded connected domain. We prove the convergence of the different schemes.
    Comment: 3 figures.
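    A simulated annealing search for a maximin design can be sketched on a simple non-hypercube domain, the unit disk: perturb one design point at a time, reject moves that leave the domain, and accept criterion-worsening moves with a temperature-controlled probability. The cooling schedule, step size, and domain here are illustrative assumptions, not the schemes whose convergence the paper proves.

```python
import math
import random

def min_dist(pts):
    """Maximin criterion: the smallest pairwise distance of the design."""
    return min(math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:])

def in_disk(p):
    return p[0] ** 2 + p[1] ** 2 <= 1.0

def maximin_anneal(n_pts, n_iter, seed=0):
    """Simulated annealing for a maximin design on the unit disk."""
    rng = random.Random(seed)
    pts = []
    while len(pts) < n_pts:  # rejection-sample an initial design
        p = (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
        if in_disk(p):
            pts.append(p)
    crit = init_crit = min_dist(pts)
    best, best_crit = list(pts), crit
    for k in range(n_iter):
        temp = 0.1 * (1.0 - k / n_iter) + 1e-6  # linear cooling schedule
        i = rng.randrange(n_pts)
        cand = (pts[i][0] + rng.gauss(0.0, 0.1),
                pts[i][1] + rng.gauss(0.0, 0.1))
        if not in_disk(cand):
            continue  # the move must stay inside the domain
        old, pts[i] = pts[i], cand
        new_crit = min_dist(pts)
        if new_crit >= crit or rng.random() < math.exp((new_crit - crit) / temp):
            crit = new_crit
            if crit > best_crit:
                best, best_crit = list(pts), crit
        else:
            pts[i] = old  # reject: restore the perturbed point
    return best, best_crit, init_crit
```

    Tracking the best design seen guarantees the returned criterion is at least that of the random initial design, even though annealing occasionally accepts worse configurations.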

    A fully objective Bayesian approach for the Behrens-Fisher problem using historical studies

    For in vivo research experiments with small sample sizes and available historical data, we propose a sequential Bayesian method for the Behrens-Fisher problem. We consider it as a model choice question with two models in competition: one for which the two expectations are equal and one for which they are different. The choice between the two models is performed through a Bayesian analysis, based on a robust choice of combined objective and subjective priors, set on the parameter space and on the model space. Three steps are necessary to evaluate the posterior probability of each model using two historical datasets similar to the one of interest. Starting from the Jeffreys prior, a posterior using a first historical dataset is deduced and is used to calibrate the Normal-Gamma informative priors for the second historical dataset analysis, in addition to a uniform prior on the model space. From this second step, a new posterior on the parameter space and the model space can be used as the objective informative prior for the last Bayesian analysis. Bayesian and frequentist methods have been compared on simulated and real data. In accordance with FDA recommendations, control of type I and type II error rates has been evaluated; the proposed method controls both even if the historical experiments are not completely similar to the one of interest.
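    The model-choice framing (equal means versus different means, with unequal variances) can be illustrated with a much cruder device than the paper's sequential prior construction: a BIC approximation to the posterior model probabilities under a uniform model prior. The fixed-point iteration for the common-mean MLE and the data below are illustrative assumptions.

```python
import math

def normal_loglik(y, mu, var):
    """Log-likelihood of i.i.d. N(mu, var) data."""
    n = len(y)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - 0.5 * sum((v - mu) ** 2 for v in y) / var)

def bic_behrens_fisher(y1, y2, n_fix=50):
    """BIC-approximate posterior probabilities (uniform model prior) for
    M0: mu1 = mu2 with unequal variances, vs M1: separate means."""
    n1, n2 = len(y1), len(y2)
    m1, m2 = sum(y1) / n1, sum(y2) / n2
    v1 = sum((v - m1) ** 2 for v in y1) / n1
    v2 = sum((v - m2) ** 2 for v in y2) / n2
    # M1: each group keeps its own mean and variance (4 parameters).
    ll1 = normal_loglik(y1, m1, v1) + normal_loglik(y2, m2, v2)
    # M0: common mean, unequal variances (3 parameters); the MLE has no
    # closed form, so iterate the precision-weighted-mean fixed point.
    mu, s1, s2 = (n1 * m1 + n2 * m2) / (n1 + n2), v1, v2
    for _ in range(n_fix):
        mu = (n1 / s1 * m1 + n2 / s2 * m2) / (n1 / s1 + n2 / s2)
        s1 = sum((v - mu) ** 2 for v in y1) / n1
        s2 = sum((v - mu) ** 2 for v in y2) / n2
    ll0 = normal_loglik(y1, mu, s1) + normal_loglik(y2, mu, s2)
    n = n1 + n2
    bic0 = -2.0 * ll0 + 3.0 * math.log(n)
    bic1 = -2.0 * ll1 + 4.0 * math.log(n)
    w0, w1 = math.exp(-0.5 * bic0), math.exp(-0.5 * bic1)
    return w0 / (w0 + w1), w1 / (w0 + w1)
```

    Unlike the paper's method, this sketch uses no historical data; it only conveys how the two-model comparison turns a Behrens-Fisher question into a posterior probability.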